Fisher and Shannon Information in Finite Neural Populations
Authors
Abstract
The precision of the neural code is commonly investigated using two families of statistical measures: Shannon mutual information and derived quantities for very small populations of neurons, and Fisher information for large populations. These statistical tools are no longer the preserve of theorists and are being applied by experimental research groups to the analysis of empirical data. Although the relationship between information-theoretic and Fisher-based measures in the limit of infinite populations is relatively well understood, how these measures compare in finite-size populations has not yet been systematically explored. We aim to close this gap. We are particularly interested in understanding which stimuli are best encoded by a given neuron within a population, and how this depends on the chosen measure. We use a novel Monte Carlo approach to compute a stimulus-specific decomposition of the mutual information (the SSI) for populations of up to 256 neurons, and show that Fisher information can be used to accurately estimate both mutual information and SSI for populations on the order of 100 neurons, even in the presence of biologically realistic variability, noise correlations, and experimentally relevant integration times. According to both measures, the best-encoded stimuli are those falling on the flanks of the neuron's tuning curve. In populations of fewer than around 50 neurons, however, Fisher information can be misleading.
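The large-population link between the two measures that the abstract relies on can be illustrated numerically. The sketch below computes the population Fisher information J(s) for independent Poisson neurons with Gaussian tuning curves and plugs it into the standard asymptotic approximation of mutual information for a uniform stimulus, I ≈ H(S) − ½ E[log₂(2πe / J(s))]. The tuning-curve model, parameter values, and population size are illustrative assumptions, not the paper's actual simulation settings.

```python
import numpy as np

def fisher_information(s, centers, amp=10.0, width=0.3):
    """Population Fisher information J(s) = sum_i f_i'(s)^2 / f_i(s)
    for independent Poisson neurons with Gaussian tuning curves f_i."""
    # f has shape (n_neurons, n_stimuli)
    f = amp * np.exp(-0.5 * ((s - centers[:, None]) / width) ** 2)
    df = -(s - centers[:, None]) / width**2 * f  # df_i/ds
    return (df**2 / f).sum(axis=0)

# A population of 100 neurons tiling the stimulus range [0, 1]
centers = np.linspace(0.0, 1.0, 100)
s = np.linspace(0.2, 0.8, 7)          # stimuli away from the edges
J = fisher_information(s, centers)

# Asymptotic mutual-information estimate (bits) for a stimulus
# uniform on [0, 1], whose entropy H(S) is 0 bits:
mi_est = 0.0 - 0.5 * np.mean(np.log2(2 * np.pi * np.e / J))
```

For ~100 neurons this estimate tracks the exact (Monte Carlo) mutual information well; as the abstract notes, for populations below roughly 50 neurons the same substitution can be misleading.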
Similar articles
The Fisher-Shannon information plane, an electron correlation tool.
A new correlation measure, the product of the Shannon entropy power and the Fisher information of the electron density, is introduced by analyzing the Fisher-Shannon information plane of some two-electron systems (He-like ions, Hooke's atoms). The uncertainty and scaling properties of this information product are pointed out. In addition, the Fisher and Shannon measures of a finite many-electro...
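The information product described in this abstract can be checked numerically in one dimension: with entropy power N = e^{2H}/(2πe) and Fisher information I = ∫ p′(x)²/p(x) dx, the product C = N·I is at least 1, with equality for a Gaussian density. The sketch below verifies this for an assumed Gaussian test density; the grid and σ are illustrative choices.

```python
import numpy as np

# Fine grid over a 1-D Gaussian density p(x) with standard deviation sigma
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
sigma = 1.3
p = np.exp(-0.5 * (x / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

H = -np.sum(p * np.log(p)) * dx           # differential entropy (nats)
N = np.exp(2 * H) / (2 * np.pi * np.e)    # entropy power
dp = np.gradient(p, x)
I = np.sum(dp**2 / p) * dx                # Fisher information of the density
C = N * I                                 # Fisher-Shannon product, = 1 for a Gaussian
```

For the Gaussian, H = ½ ln(2πeσ²) gives N = σ² and I = 1/σ², so C = 1 exactly; non-Gaussian (more structured) densities push C above 1, which is what makes the product usable as a correlation measure.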
Information Rates and Optimal Decoding in Large Neural Populations
Many fundamental questions in theoretical neuroscience involve optimal decoding and the computation of Shannon information rates in populations of spiking neurons. In this paper, we apply methods from the asymptotic theory of statistical inference to obtain a clearer analytical understanding of these quantities. We find that for large neural populations carrying a finite total amount of informa...
Neurometric function analysis of population codes
The relative merits of different population coding schemes have mostly been analyzed in the framework of stimulus reconstruction using Fisher Information. Here, we consider the case of stimulus discrimination in a two alternative forced choice paradigm and compute neurometric functions in terms of the minimal discrimination error and the Jensen-Shannon information to study neural population cod...
Optimum Design of Liquified Natural Gas Bi-lobe Tanks using Finite Element, Genetic Algorithm and Neural Network
A comprehensive set of ten artificial neural networks is developed to suggest optimal dimensions of type 'C' bi-lobe tanks used in the shipping of liquefied natural gas. A multi-objective optimization technique considering the maximum capacity and minimum cost of vessels is implemented to determine optimum vessel dimensions. Generated populations from a genet...
Unique Additive Information Measures: Boltzmann-Gibbs-Shannon, Fisher and Beyond
It is proved that the only additive and isotropic information measure that can depend on the probability distribution and also on its first derivative is a linear combination of the Boltzmann-Gibbs-Shannon and Fisher information measures. Further possibilities are investigated, too.
Journal: Neural Computation
Volume 24, Issue 7
Pages: -
Published: 2012